Efficient Sparse Recovery and Demixing Using Nonconvex Regularization


Similar Articles

Sparse Reduced Rank Regression With Nonconvex Regularization

In this paper, the estimation problem for the sparse reduced rank regression (SRRR) model is considered. The SRRR model is widely used for dimension reduction and variable selection, with applications in signal processing, econometrics, etc. The problem is formulated as minimizing the least-squares loss with a sparsity-inducing penalty under an orthogonality constraint. Convex sparsity-inducing ...
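As a rough sketch of this kind of formulation (the exact model in the cited paper may differ), a sparsity-penalized reduced rank regression with an orthogonality constraint can be written as

\min_{A,B}\ \tfrac{1}{2}\,\lVert Y - X A B^{\top}\rVert_F^2 \;+\; \lambda \sum_{j} \lVert a_j\rVert_2 \quad \text{subject to}\quad B^{\top} B = I,

where the coefficient matrix is factored as C = A B^{\top}, a_j denotes the j-th row of A, the row-wise group penalty performs variable selection, and the orthogonality constraint on B removes the rotational ambiguity of the factorization.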


A Graduated Nonconvex Regularization for Sparse High Dimensional Model Estimation

Many high dimensional data mining problems can be formulated as minimizing an empirical loss function with a penalty proportional to the number of variables required to describe a model. We propose a graduated non-convexification method to facilitate tracking of a global minimizer of this problem. We prove that under some conditions the proposed regularization problem using the continuous piece...
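As a hedged illustration of the general graduated-nonconvexity idea (the specific penalty family and schedule here are assumptions, not taken from the cited paper), one solves a sequence of problems

\min_{\beta}\ L_n(\beta) \;+\; \lambda \sum_{i} \rho_{t_k}(\lvert \beta_i \rvert), \qquad k = 0, 1, \dots, K,

where \rho_{t_0} is convex (close to the absolute value) and \rho_{t_k} approaches the ℓ0 indicator \mathbf{1}\{\beta_i \neq 0\} as k grows; the minimizer of each stage warm-starts the next, which helps the iterates track a global minimizer of the final, most nonconvex problem.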


Nonconvex Sparse Logistic Regression with Weakly Convex Regularization

In this work we propose to fit a sparse logistic regression model by solving a nonconvex optimization problem with a weakly convex regularizer. The idea is based on the finding that a weakly convex function, as an approximation of the ℓ0 pseudo-norm, is able to induce sparsity better than the commonly used ℓ1 norm. For a class of weakly convex sparsity-inducing functions, we prove the nonconvexity of the corres...
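For concreteness, one standard weakly convex surrogate for the ℓ0 pseudo-norm is the minimax concave penalty (MCP); using MCP here is an assumption for illustration, not necessarily the function studied in that paper:

\rho_{\lambda,\gamma}(t) \;=\; \begin{cases} \lambda \lvert t\rvert - \dfrac{t^{2}}{2\gamma}, & \lvert t\rvert \le \gamma\lambda,\\[4pt] \dfrac{\gamma\lambda^{2}}{2}, & \lvert t\rvert > \gamma\lambda, \end{cases}
\qquad
\hat{\beta} \in \arg\min_{\beta}\ \frac{1}{n}\sum_{i=1}^{n} \log\!\bigl(1 + e^{-y_i x_i^{\top}\beta}\bigr) \;+\; \sum_{j} \rho_{\lambda,\gamma}(\beta_j).

The penalty is nonconvex, but \rho_{\lambda,\gamma}(t) + t^{2}/(2\gamma) is convex, i.e. the regularizer is (1/\gamma)-weakly convex, which is exactly the kind of property such analyses exploit.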


Support recovery without incoherence: A case for nonconvex regularization

We develop a new primal-dual witness proof framework that may be used to establish variable selection consistency and ℓ∞-bounds for sparse regression problems, even when the loss function and regularizer are nonconvex. We use this method to prove two theorems concerning support recovery and ℓ∞-guarantees for a regression estimator in a general setting. Notably, our theory applies to all potential...
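The estimators covered by such a framework are typically regularized M-estimators of the generic form (written here as an assumed template, not a formulation quoted from the paper)

\hat{\beta} \in \arg\min_{\beta}\ \bigl\{ L_n(\beta) + \rho_{\lambda}(\beta) \bigr\},

where the loss L_n and the coordinate-separable regularizer \rho_{\lambda} (e.g. SCAD or MCP) may both be nonconvex; a primal-dual witness argument constructs a candidate solution supported on the true support and then verifies strict dual feasibility off the support to certify support recovery.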


Sparse Recovery via Partial Regularization: Models, Theory and Algorithms

In the context of sparse recovery, it is known that most existing regularizers, such as the ℓ1 norm, suffer from bias incurred by the leading entries (in magnitude) of the associated vector. To neutralize this bias, we propose a class of models with partial regularizers for recovering a sparse solution of a linear system. We show that every local minimizer of these models is sufficiently sparse o...
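As a sketch of the partial-regularization idea (the exact class of models in that paper may be broader), one leaves the r largest-magnitude entries unpenalized and regularizes only the remaining ones:

\min_{x}\ \tfrac{1}{2}\lVert A x - b\rVert_2^2 \;+\; \lambda \sum_{i = r+1}^{n} \lvert x\rvert_{[i]},

where \lvert x\rvert_{[1]} \ge \lvert x\rvert_{[2]} \ge \dots \ge \lvert x\rvert_{[n]} are the entries of x sorted by magnitude; excluding the r leading entries from the penalty removes the shrinkage bias they would otherwise suffer, which is the bias discussed above.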



Journal

Journal Title: IEEE Access

Year: 2019

ISSN: 2169-3536

DOI: 10.1109/access.2019.2915311